# Low-cost training
## PARD Llama 3.2 1B
amd · MIT · Large Language Model · Transformers · Downloads: 2,219 · Likes: 1

PARD is a high-performance speculative decoding method that converts autoregressive draft models into parallel draft models at low cost, significantly accelerating the inference of large language models.
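PARD builds on speculative decoding, in which a cheap draft model proposes several tokens and the larger target model verifies them, keeping the longest agreeing prefix. A minimal greedy sketch of that draft-then-verify loop follows; the function and the toy next-token models are illustrative assumptions, not PARD's actual parallel-draft algorithm:

```python
def speculative_decode(target_next, draft_next, prefix, k, steps):
    """Greedy speculative decoding sketch.

    target_next / draft_next: functions mapping a token list to the
    next token. The draft proposes k tokens per round; the target
    verifies them and keeps the longest agreeing prefix, then always
    contributes one token of its own, so output matches pure greedy
    decoding with the target model.
    """
    out = list(prefix)
    for _ in range(steps):
        # Draft proposes k tokens autoregressively (cheap model).
        proposal, ctx = [], list(out)
        for _ in range(k):
            t = draft_next(ctx)
            proposal.append(t)
            ctx.append(t)
        # Target verifies the proposals (a single batched pass in practice).
        accepted = 0
        for i, t in enumerate(proposal):
            if target_next(out + proposal[:i]) == t:
                accepted += 1
            else:
                break
        out.extend(proposal[:accepted])
        # On the first mismatch (or after full acceptance) the target
        # emits one guaranteed-correct token.
        out.append(target_next(out))
    return out
```

With a perfect draft, each round yields k + 1 tokens for one target pass; with a useless draft, the loop degrades gracefully to one target token per round, never producing output the target alone would not have produced.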
## Hyperclovax SEED Text Instruct 0.5B
naver-hyperclovax · Other · Large Language Model · Transformers · Downloads: 7,531 · Likes: 60

A Korean-optimized text-generation model with instruction-following capability, whose lightweight design makes it suitable for edge-device deployment.
## JetMoE 8B
jetmoe · Apache-2.0 · Large Language Model · Transformers · Downloads: 1,337 · Likes: 246

JetMoE-8B is an efficient open-source large language model that achieves performance comparable to LLaMA2-7B at a training cost under $100,000, specifically designed for low-resource environments.
## Loquace 7B Mistral
cosimoiaia · Apache-2.0 · Large Language Model · Transformers, Other · Downloads: 17 · Likes: 15

Loquace is an Italian-speaking, instruction-fine-tuned large language model aimed at promoting the democratization of AI and LLMs in Italy.